
# Self-supervised Pre-training

## LWM
LWM is the first foundation model for wireless communications, a universal feature extractor that derives fine-grained representations from wireless channel data.
Tags: Physics Model, Transformers
Publisher: wi-lab · Downloads: 137 · Likes: 3
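
As a feature extractor, LWM is meant to be loaded once and applied to batches of channel samples to obtain embeddings. The sketch below is a generic illustration only, not the documented LWM interface: the repo ID "wi-lab/lwm", the trust_remote_code loading path, and the input shape are all assumptions.

```python
# Generic sketch, not the official LWM API: load a Transformers-compatible
# checkpoint with custom code and embed a batch of channel samples.
# The repo ID and the input layout below are assumptions.
import torch
from transformers import AutoModel

model = AutoModel.from_pretrained("wi-lab/lwm", trust_remote_code=True)

channel_batch = torch.randn(1, 128, 64)  # hypothetical tokenized channel input
with torch.no_grad():
    features = model(channel_batch)  # fine-grained channel representations
```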

## MERT Base
MERT is an acoustic music understanding model based on self-supervised learning, pre-trained with pseudo-labels provided by a teacher model.
Tags: Audio Classification, Transformers
Publisher: yangwang825 · Downloads: 26 · Likes: 0
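
A minimal feature-extraction sketch follows. It assumes a checkpoint laid out like the public MERT-v1 release (m-a-p/MERT-v1-95M), which loads through AutoModel with trust_remote_code=True and expects 24 kHz audio; the checkpoint listed here may differ.

```python
# Minimal sketch: frame-level music representations from a MERT checkpoint.
# "m-a-p/MERT-v1-95M" is an assumed repo ID from the public MERT release.
import torch
from transformers import AutoModel, Wav2Vec2FeatureExtractor

model_id = "m-a-p/MERT-v1-95M"
model = AutoModel.from_pretrained(model_id, trust_remote_code=True)
extractor = Wav2Vec2FeatureExtractor.from_pretrained(model_id, trust_remote_code=True)

audio = torch.zeros(24000).numpy()  # one second of 24 kHz audio as a stand-in
inputs = extractor(audio, sampling_rate=24000, return_tensors="pt")

with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # (batch, frames, hidden_size)
print(hidden.shape)
```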

## WavLM Base
WavLM is a large-scale self-supervised pre-trained speech model from Microsoft, pre-trained on 16 kHz speech audio and suited to full-stack speech processing tasks.
Tags: Speech Recognition, Transformers, English
Publisher: microsoft · Downloads: 28.33k · Likes: 7
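
Because this checkpoint is pre-trained only, it is normally used as a frame-level feature extractor under a task-specific head. A minimal sketch, assuming the Hugging Face checkpoint "microsoft/wavlm-base":

```python
# Minimal sketch: frame-level speech features from WavLM Base.
import torch
from transformers import AutoFeatureExtractor, WavLMModel

extractor = AutoFeatureExtractor.from_pretrained("microsoft/wavlm-base")
model = WavLMModel.from_pretrained("microsoft/wavlm-base")

waveform = torch.zeros(16000).numpy()  # one second of silent 16 kHz audio
inputs = extractor(waveform, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

print(outputs.last_hidden_state.shape)  # (batch, frames, hidden_size)
```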

## TAPAS Tiny
License: Apache-2.0
TAPAS is a Transformer-based table question answering model, pre-trained in a self-supervised manner on English Wikipedia table data; it supports table QA and table entailment tasks.
Tags: Large Language Model, Transformers, English
Publisher: google · Downloads: 44 · Likes: 0
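
The pre-trained checkpoint provides table representations only; answering questions needs a fine-tuned head. A minimal sketch, assuming a WTQ fine-tuned variant such as "google/tapas-tiny-finetuned-wtq" (an assumed checkpoint name):

```python
# Minimal sketch: table question answering with a fine-tuned TAPAS checkpoint.
import pandas as pd
from transformers import TapasForQuestionAnswering, TapasTokenizer

model_id = "google/tapas-tiny-finetuned-wtq"  # assumed fine-tuned variant
tokenizer = TapasTokenizer.from_pretrained(model_id)
model = TapasForQuestionAnswering.from_pretrained(model_id)

# TAPAS expects every table cell as a string.
table = pd.DataFrame({"City": ["Paris", "Berlin"], "Population": ["2100000", "3700000"]})
queries = ["Which city has a population of 3700000?"]

inputs = tokenizer(table=table, queries=queries, padding="max_length", return_tensors="pt")
outputs = model(**inputs)

# Map logits back to table coordinates and read off the predicted cells.
coords, _ = tokenizer.convert_logits_to_predictions(
    inputs, outputs.logits.detach(), outputs.logits_aggregation.detach()
)
print([table.iat[row, col] for row, col in coords[0]])
```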